Relative α-entropy minimizers subject to linear statistical constraints
Authors
Abstract
We study minimization of a parametric family of relative entropies, termed relative α-entropies (denoted Iα(P,Q)). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies generalize the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, relative α-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. We study minimization of Iα(P,Q) over the first argument on a set of probability distributions that constitutes a linear family. Such a minimization generalizes the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed the Iα-projection) for a linear family is shown to have a power-law form.
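For concreteness, Iα(P,Q) can be evaluated numerically on finite alphabets. The sketch below is an illustration, not the paper's code; it assumes the closed-form expression Iα(P,Q) = (α/(1−α)) log Σ p q^(α−1) − (1/(1−α)) log Σ p^α + log Σ q^α, which reduces to 0 when P = Q and to the Kullback-Leibler divergence as α → 1, and all function names are made up here:

```python
import numpy as np

def relative_alpha_entropy(p, q, alpha):
    """Relative alpha-entropy I_alpha(P, Q) in nats, for alpha != 1.

    Uses the closed form (an assumption of this sketch):
        I_alpha(P,Q) = a/(1-a) * log sum_x p(x) q(x)^(a-1)
                       - 1/(1-a) * log sum_x p(x)^a
                       + log sum_x q(x)^a
    which vanishes when P = Q and tends to KL divergence as a -> 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    a = alpha
    t1 = (a / (1.0 - a)) * np.log(np.sum(p * q ** (a - 1.0)))
    t2 = -(1.0 / (1.0 - a)) * np.log(np.sum(p ** a))
    t3 = np.log(np.sum(q ** a))
    return float(t1 + t2 + t3)

def kl_divergence(p, q):
    """Ordinary relative entropy D(P || Q) in nats, for reference."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))
```

As a quick sanity check, `relative_alpha_entropy(p, p, 0.7)` is 0 for any distribution `p`, and `relative_alpha_entropy(p, q, alpha)` approaches `kl_divergence(p, q)` as `alpha` approaches 1, consistent with Iα generalizing the usual relative entropy.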
Similar articles
Minimization Problems Based on Relative $\alpha$-Entropy II: Reverse Projection
In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted Iα) were studied. Such minimizers were called forward Iα-projections. Here, a complementary class of minimization problems leading to the so-called reverse Iα-projections are studied. Reverse Iα-projections, particularly on log-convex or power-law families, are of interest ...
Minimization Problems Based on a Parametric Family of Relative Entropies I: Forward Projection
Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative α-entropies (denoted Iα), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual r...
Can Entropic Regularization Be Replaced by Squared Euclidean Distance Plus Additional Linear Constraints
There are two main families of on-line algorithms depending on whether a relative entropy or a squared Euclidean distance is used as a regularizer. The difference between the two families can be dramatic. The question is whether one can always achieve comparable performance by replacing the relative entropy regularization by the squared Euclidean distance plus additional linear constraints. We ...
Comment on “From Plant Traits to Plant Communities: A Statistical Mechanistic Approach to Biodiversity”
Shipley et al. (Reports, 3 November 2006, p. 812) predicted plant community composition and relative abundances with a high level of accuracy by maximizing Shannon’s index of information entropy (species diversity), subject to constraints on plant trait averages. We show that the entropy maximization assumption is relatively unimportant and that the high accuracy is due largely to a statistical...